Last Uncertainty Wednesday I laid out the basic framework for understanding uncertainty: we use observations and explanations to have knowledge of reality. The degree of uncertainty we are facing results from the limits to our knowledge. Now I will start to tackle the limits to observations. These fall into four groups: the nature of observations (today’s post), observational (aka measurement) error, the cost of observations and the impact of observations on reality.
Observations are just that, observations. This sounds tautological but is important to keep in mind at all times. Observations are not the underlying reality itself. They are the output of a sequence of processes that produces data points. These data points are a summary of reality.
Take temperature observations as an example. At the most primitive we can touch an object and use our sense of touch to feel if the object is hot or cold. That way of observing temperature has obvious and severe limitations. Our temperature sense is subjective, so you and I may disagree on whether an object is cold or not. Another limitation is that our skin can only be exposed to a relatively small temperature range without being injured. And so on. To address these issues, we have invented sophisticated thermometers as measurement devices for temperature. We have also figured out how to use the light emanating from an object as a way of observing its temperature.
But we should not fool ourselves into taking these measurements of temperature as anything other than observations. They are not the reality of the object. They are not “das Ding an sich” so to say. What we observe as temperature is the motion of the atoms that make up the object. Even for relatively simple objects that means billions of atoms. When we observe temperature we are not recording the state of each atom but an aggregate behavior that emerges from these individual states.
Temperature is thus a massive simplification, a summary of the underlying reality. That is precisely what makes temperature observations useful, but it is also where some uncertainty has snuck in: to the extent that the distribution of heat across atoms matters, the summary does not capture it. As we will see later, in some cases that's perfectly fine and in others it results in large uncertainty (going deeper here would be getting ahead of ourselves).
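A tiny sketch makes this concrete (the numbers are hypothetical, chosen only for illustration): two "objects" with completely different distributions of per-atom kinetic energy can produce the identical summary statistic, so an observer who sees only the summary cannot tell them apart.

```python
from statistics import mean, stdev

# Hypothetical per-atom kinetic energies (arbitrary units) for two tiny objects.
uniform_object = [5.0, 5.0, 5.0, 5.0, 5.0, 5.0]
skewed_object = [0.5, 0.5, 0.5, 0.5, 0.5, 27.5]

# Here the mean kinetic energy stands in for the aggregate we call "temperature".
temp_uniform = mean(uniform_object)
temp_skewed = mean(skewed_object)

print(temp_uniform, temp_skewed)  # both 5.0: the summaries are identical
print(stdev(uniform_object), stdev(skewed_object))  # the distributions are not
```

The summary is what makes the observation tractable, and the summary is exactly where the information about the distribution is lost.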
The key point to take away is the following: if we want to reason about reality, our observations *must* condense reality. The classic example to illustrate this idea is maps. Every useful map omits features of the actual terrain. The only map that doesn't would be a 1:1 map, which isn't a map at all: it is the terrain itself. Every set of observations, in order to be useful, must be smaller than the underlying reality which we seek to understand. And that very act of summarizing is what introduces a foundational limit of observations and hence uncertainty.
One of the questions we will be dealing with in this series, then, is how much uncertainty results in different situations from this fundamental limitation of observations (and of course from the other limits to be discussed in upcoming posts).
I would be remiss if I didn’t point out that this fundamental limitation to observations would not exist under one condition: that the *perfect* explanation for everything is captured by a mathematical formula. You may recall the example of the logistic map from the introductory post. The extreme view of this is captured in Max Tegmark’s “Our Mathematical Universe,” which speculates that our reality *is* a mathematical object. Of course, if reality is a mathematical object then it is perfectly represented (explained) by that object, and observations of the parameters of that object would be the thing itself. Personally I am much more in the camp of Lee Smolin and Roberto Unger, who in “The Singular Universe and the Reality of Time” argue that mathematics is and always will be an approximation of reality.
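For readers who don't recall the logistic map, it fits in a few lines (this is a generic sketch with illustrative parameter values, not tied to the specifics of the earlier post). The formula x_{n+1} = r·x_n·(1 − x_n) is an exact mathematical description, yet even then our observations constrain us: for r = 4, two starting points that differ by one part in a million quickly end up far apart, so imprecisely observed initial conditions still leave us uncertain about later states.

```python
def logistic_map(r, x0, steps):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points that differ by one part in a million...
a = logistic_map(4.0, 0.200000, 30)
b = logistic_map(4.0, 0.200001, 30)

# ...diverge rapidly: the formula is perfect, the observation of x0 is not.
print(abs(a[-1] - b[-1]))
```

The point is not that the formula is wrong, but that a perfect formula only eliminates observational limits if the observations themselves can be perfect.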